The least favorable noise

Authors

Abstract

Suppose that a random variable X of interest is observed perturbed by independent additive noise Y. This paper concerns the "least favorable perturbation" Ŷ_ε, which maximizes the prediction error E(X − E(X | X + Y))² over the class of Y with Var(Y) ≤ ε. We find a characterization of the answer to this question, and show by example that it can be surprisingly complicated. However, in the special case where X is infinitely divisible, the solution is complete and simple. We also explore the conjecture that noisier Y makes prediction worse.
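As a concrete illustration (ours, not from the paper): in the fully Gaussian case the conditional mean E(X | X + Y) is linear, so the prediction error E(X − E(X | X + Y))² has the closed form Var(X)·ε / (Var(X) + ε) when Var(Y) = ε. The sketch below checks this against a Monte Carlo estimate; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_x2, eps = 1.0, 0.5  # Var(X) and the noise budget Var(Y) <= eps

# Gaussian case: E(X | X + Y) = sigma_x2 / (sigma_x2 + eps) * (X + Y),
# giving prediction error sigma_x2 * eps / (sigma_x2 + eps).
closed_form = sigma_x2 * eps / (sigma_x2 + eps)

# Monte Carlo check: simulate (X, Y), form the conditional mean of X
# given Z = X + Y, and average the squared prediction error.
n = 200_000
x = rng.normal(0.0, np.sqrt(sigma_x2), n)
y = rng.normal(0.0, np.sqrt(eps), n)
z = x + y
cond_mean = sigma_x2 / (sigma_x2 + eps) * z
mc_error = np.mean((x - cond_mean) ** 2)
```

With Var(X) = 1 and ε = 0.5, the closed form gives 1/3, and the Monte Carlo estimate agrees to within sampling error.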


Similar articles

The Structure of Least-Favorable Noise in Gaussian Vector Broadcast Channels

The sum capacity of the Gaussian vector broadcast channel is the saddle point of a Gaussian mutual information game in which the transmitter maximizes the mutual information by choosing the best transmit covariance matrix subject to a power constraint, and the receiver minimizes the mutual information by choosing a least-favorable noise covariance matrix subject to a diagonal constraint. This r...


Gaussian Assumption: the Least Favorable but the Most Useful

The Gaussian distribution is the most well-known and widely used distribution in many fields such as engineering, statistics, and physics. One of the major reasons the Gaussian distribution has become so prominent is the Central Limit Theorem (CLT) and the fact that the distribution of noise in numerous engineering systems is well captured by the Gaussian distribution. Moreover, feature...


On least favorable configurations for step-up-down tests

This paper investigates an open issue related to false discovery rate (FDR) control of step-up-down (SUD) multiple testing procedures. It has been established in earlier literature that for this type of procedure, under some broad conditions and in an asymptotic sense, the FDR is maximized when the signal strength under the alternative is maximal. In other words, the so-called "Dirac uniform confi...


Jeffreys’ prior is asymptotically least favorable under entropy risk

We provide a rigorous proof that Jeffreys’ prior asymptotically maximizes Shannon’s mutual information between a sample of size n and the parameter. This was conjectured by Bernardo (1979) and, despite the absence of a proof, forms the basis of the reference prior method in Bayesian statistical analysis. Our proof rests on an examination of large-sample decision-theoretic properties associated ...



Journal

Journal title: Electronic Communications in Probability

Year: 2022

ISSN: 1083-589X

DOI: https://doi.org/10.1214/22-ecp467